# Table Reasoning

## Tessar Largest

**Organization:** SVECTOR-CORPORATION · **License:** MIT · **Tags:** Question Answering System, English · **Downloads:** 101 · **Likes:** 1

Tessar is an advanced table reasoning model developed by SVECTOR, built on research aimed at advancing neural table understanding.
## ProTrix

**Organization:** pkupie · **Tags:** Question Answering System, Transformers · **Downloads:** 14 · **Likes:** 0

ProTrix is a model designed for planning and reasoning over tables and sentences in context, targeting joint reasoning tasks that involve both tabular and textual data.
## ReasTAP Large Finetuned WikiSQL

**Organization:** Yale-LILY · **Tags:** Question Answering System, Transformers, English · **Downloads:** 27 · **Likes:** 1

ReasTAP is a pre-trained table reasoning model that injects table reasoning skills through synthetic reasoning examples; this checkpoint is fine-tuned on the WikiSQL dataset.
## ReasTAP Large Finetuned WTQ

**Organization:** Yale-LILY · **Tags:** Question Answering System, Transformers, English · **Downloads:** 66 · **Likes:** 2

ReasTAP is a pre-trained table reasoning model that injects table reasoning skills through synthetic reasoning examples; this checkpoint is fine-tuned on the WikiTableQuestions dataset.
## TAPEX Large SQL Execution

**Organization:** microsoft · **License:** MIT · **Tags:** Large Language Model, Transformers, English · **Downloads:** 68 · **Likes:** 17

TAPEX achieves table pre-training by learning a neural SQL executor. It is based on the BART architecture and is designed for table reasoning tasks; this checkpoint executes SQL queries over tables.
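
A minimal usage sketch for this checkpoint, assuming the standard Hugging Face Transformers TAPEX interface (`TapexTokenizer` plus `BartForConditionalGeneration`); the table and SQL query below are illustrative only.

```python
# Minimal sketch: executing a SQL query over a small table with TAPEX.
# Assumes the standard Transformers TAPEX interface; table and query are illustrative.
import pandas as pd
from transformers import TapexTokenizer, BartForConditionalGeneration

tokenizer = TapexTokenizer.from_pretrained("microsoft/tapex-large-sql-execution")
model = BartForConditionalGeneration.from_pretrained("microsoft/tapex-large-sql-execution")

# A toy table of Olympic host cities; cells are kept as strings.
table = pd.DataFrame({
    "year": ["1896", "2008", "2012"],
    "city": ["athens", "beijing", "london"],
})

# TAPEX linearizes the table together with the query into a single input sequence.
query = "select year where city = 'beijing'"
encoding = tokenizer(table=table, query=query, return_tensors="pt")

# The BART decoder generates the execution result as plain text.
outputs = model.generate(**encoding)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True))
```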
## TAPEX Large Finetuned WTQ

**Organization:** microsoft · **License:** MIT · **Tags:** Question Answering System, Transformers, English · **Downloads:** 2,431 · **Likes:** 74

TAPEX is a table pre-training model that learns through a neural SQL executor. It is based on the BART architecture and is designed for table reasoning tasks; this checkpoint is fine-tuned on the WikiTableQuestions dataset.
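
A sketch of natural-language table question answering with this checkpoint, again assuming the `TapexTokenizer`/`BartForConditionalGeneration` interface; the table and question are illustrative.

```python
# Minimal sketch: answering a natural-language question over a table with the
# WTQ-fine-tuned TAPEX checkpoint. Table and question are illustrative.
import pandas as pd
from transformers import TapexTokenizer, BartForConditionalGeneration

tokenizer = TapexTokenizer.from_pretrained("microsoft/tapex-large-finetuned-wtq")
model = BartForConditionalGeneration.from_pretrained("microsoft/tapex-large-finetuned-wtq")

table = pd.DataFrame({
    "city": ["athens", "beijing", "london"],
    "year": ["1896", "2008", "2012"],
})

# The question is encoded jointly with the linearized table.
question = "in which year did beijing host the olympic games?"
encoding = tokenizer(table=table, query=question, return_tensors="pt")

# The model generates the answer string directly.
answer_ids = model.generate(**encoding)
print(tokenizer.batch_decode(answer_ids, skip_special_tokens=True))
```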
## TAPEX Large

**Organization:** microsoft · **License:** MIT · **Tags:** Large Language Model, Transformers, English · **Downloads:** 252 · **Likes:** 9

TAPEX is a model designed for table reasoning tasks; it performs table pre-training by learning a neural SQL executor and is based on the BART architecture.
## TAPEX Large Finetuned WikiSQL

**Organization:** microsoft · **License:** MIT · **Tags:** Large Language Model, Transformers, English · **Downloads:** 676 · **Likes:** 16

TAPEX is a table pre-training model learned through a neural SQL executor. It is based on the BART architecture and is designed for table reasoning tasks; this checkpoint is fine-tuned on the WikiSQL dataset.
## TAPEX Base

**Organization:** microsoft · **License:** MIT · **Tags:** Large Language Model, Transformers, English · **Downloads:** 799 · **Likes:** 43

TAPEX is a table pre-training model that learns through a neural SQL executor and can handle table reasoning tasks.
## TAPEX Large Finetuned TabFact

**Organization:** microsoft · **License:** MIT · **Tags:** Large Language Model, Transformers, English · **Downloads:** 2,255 · **Likes:** 8

TAPEX is a table pre-training model learned through a neural SQL executor. It is based on the BART architecture and is designed for table reasoning tasks; this checkpoint is fine-tuned on the TabFact dataset.
## TAPEX Base Finetuned TabFact

**Organization:** microsoft · **License:** MIT · **Tags:** Large Language Model, Transformers, English · **Downloads:** 28 · **Likes:** 0

TAPEX learns table reasoning capabilities through training as a neural SQL executor; this checkpoint is fine-tuned on the TabFact dataset.
## TAPAS Large

**Organization:** google · **License:** Apache-2.0 · **Tags:** Large Language Model, Transformers, English · **Downloads:** 211 · **Likes:** 2

TAPAS is a BERT-like Transformer model designed for processing tabular data and associated text. It is pre-trained with self-supervised learning on a large corpus of English Wikipedia tables and their accompanying text.
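
A hedged inference sketch using the Transformers `table-question-answering` pipeline. The checkpoint listed here (`google/tapas-large`) is the pre-trained base intended for fine-tuning, so the sketch assumes a fine-tuned TAPAS variant such as `google/tapas-large-finetuned-wtq`; the table and question are illustrative.

```python
# Minimal sketch: table question answering with a fine-tuned TAPAS variant.
# google/tapas-large-finetuned-wtq is assumed here because the pre-trained base
# checkpoint is not directly usable for question answering without fine-tuning.
from transformers import pipeline

table_qa = pipeline("table-question-answering", model="google/tapas-large-finetuned-wtq")

# TAPAS expects every table cell as a string.
table = {
    "city": ["athens", "beijing", "london"],
    "year": ["1896", "2008", "2012"],
}

result = table_qa(table=table, query="which city hosted the olympics in 2012?")
print(result["answer"])
```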